Cache repeated directory lookups for 12% speedup #356
Merged
Description
While profiling eslint tasks in various projects, I noticed that the polaris-related linting rules we have take up more time than necessary. This is most noticeable when the project contains lots of files.
The majority of the time in those two rules is spent in `normalizeSource`, which traverses directories upwards to find the npm package root. This is perfectly fine if we haven't seen the file before, but unnecessarily expensive when we have already resolved it once. In the projects I tried this PR on, 95% of calls to `normalizeSource` are for a source file that we already resolved. Therefore, adding a simple caching mechanism bypasses the directory traversal and leads to about 12% faster linting times on the project I've tested it on.